Statistical foundation of Variational Bayes neural networks

Authors

Abstract

Despite the popularity of Bayesian neural networks (BNNs) in recent years, their use remains limited in complex and big-data situations due to the computational cost associated with full posterior evaluations. Variational Bayes (VB) provides a useful alternative that circumvents the time complexity of generating samples from the true posterior using Markov chain Monte Carlo (MCMC) techniques. The efficacy of VB methods is well established in the machine learning literature. However, their potential for broader impact is hindered by a lack of theoretical validity from a statistical perspective. In this paper, we establish a fundamental consistency result for the mean-field variational posterior (VP) of a feed-forward artificial neural network model. The paper underlines the conditions needed to guarantee that the VP concentrates around Hellinger neighborhoods of the true density function. Additionally, the role of the scale parameter and its influence on the convergence rates is discussed. The analysis relies mainly on two results: (1) the rate at which the true posterior grows, and (2) the rate at which the Kullback-Leibler (KL) distance between the posterior and the variational posterior grows. The theory provides a guideline for building prior distributions for BNNs, along with an assessment of the accuracy of the corresponding VB implementation.
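To make the mean-field setup concrete, the following is a minimal, illustrative sketch (not the paper's construction) of variational Bayes for a small feed-forward network: the variational posterior q is a fully factorized Gaussian over all weights, the prior is an isotropic Gaussian, and the evidence lower bound (ELBO) is estimated by Monte Carlo via the reparameterization trick. All names, the network size, and the toy data are assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative assumption)
X = rng.uniform(-2, 2, size=(40, 1))
y = np.sin(X) + 0.1 * rng.normal(size=(40, 1))

H = 8          # hidden units in a 1-H-1 tanh network
D = 3 * H + 1  # total number of weights and biases

def unpack(theta):
    """Split a flat parameter vector into the weights of the 1-H-1 network."""
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H]
    return W1, b1, W2, b2

def log_lik(theta, sigma=0.1):
    """Gaussian log-likelihood of the data (up to an additive constant)."""
    W1, b1, W2, b2 = unpack(theta)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return -0.5 * np.sum((y - pred) ** 2) / sigma ** 2

def kl_gaussian(mu, rho, prior_sigma=1.0):
    """Closed-form KL(q || p) between the mean-field Gaussian
    q = N(mu, softplus(rho)^2) and the prior p = N(0, prior_sigma^2 I)."""
    s = np.log1p(np.exp(rho))  # softplus keeps the std devs positive
    return np.sum(np.log(prior_sigma / s)
                  + (s ** 2 + mu ** 2) / (2 * prior_sigma ** 2) - 0.5)

def elbo(mu, rho, n_samples=16):
    """Monte Carlo ELBO using the reparameterization theta = mu + s * eps."""
    s = np.log1p(np.exp(rho))
    ll = 0.0
    for _ in range(n_samples):
        theta = mu + s * rng.normal(size=D)
        ll += log_lik(theta)
    return ll / n_samples - kl_gaussian(mu, rho)

mu = np.zeros(D)
rho = np.full(D, -3.0)  # small initial posterior std dev
print(elbo(mu, rho))
```

Maximizing this ELBO over (mu, rho), e.g. by gradient ascent, yields the mean-field VP whose concentration around Hellinger neighborhoods of the true density is the subject of the paper's consistency result.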


Related articles

Adversarial Variational Bayes: Unifying Variational Autoencoders and Generative Adversarial Networks

Variational Autoencoders (VAEs) are expressive latent variable models that can be used to learn complex probability distributions from training data. However, the quality of the resulting model crucially relies on the expressiveness of the inference model. We introduce Adversarial Variational Bayes (AVB), a technique for training Variational Autoencoders with arbitrarily expressive inference mo...


Variational Bayes Solution of Linear Neural Networks and Its Generalization Performance

It is well known that in unidentifiable models, the Bayes estimation provides much better generalization performance than the maximum likelihood (ML) estimation. However, its accurate approximation by Markov chain Monte Carlo methods requires huge computational costs. As an alternative, a tractable approximation method, called the variational Bayes (VB) approach, has recently been proposed and ...


Auto-Encoding Variational Bayes

How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We introduce a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case. Our contributi...


Streaming Variational Bayes

Overview:
• Large, streaming data sets are increasingly the norm
• Inference for Big Data has generally been non-Bayesian
• Advantages of Bayes: complex models, coherent treatment of uncertainty, etc.
We deliver:
• SDA-Bayes, a framework for Streaming, Distributed, Asynchronous Bayesian inference
• Experiments demonstrating streaming topic discovery with comparable predictive performance to non-...



Journal

Journal title: Neural Networks

Year: 2021

ISSN: 1879-2782, 0893-6080

DOI: https://doi.org/10.1016/j.neunet.2021.01.027